3 research outputs found

    Radio Location of Partial Discharge Sources: A Support Vector Regression Approach

    Partial discharge (PD) can provide a useful forewarning of asset failure in electricity substations. A significant proportion of assets are susceptible to PD due to incipient weakness in their dielectrics. This paper examines a low-cost approach for uninterrupted monitoring of PD using a network of inexpensive radio sensors to sample the spatial patterns of PD received signal strength. Machine learning techniques are proposed for localisation of PD sources. Specifically, two models based on Support Vector Machines (SVMs) are developed: Support Vector Regression (SVR) and Least-Squares Support Vector Regression (LSSVR). These models construct an explicit regression surface in a high-dimensional feature space for function estimation. Their performance is compared to that of artificial neural network (ANN) models. The results show that both the SVR and LSSVR methods are more accurate than the ANNs. The LSSVR approach is particularly recommended as a practical alternative for PD source localisation due to its low complexity.
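    A minimal sketch (not the paper's implementation) of the general idea: treating localisation as a regression from a vector of received signal strengths to source coordinates, here using scikit-learn's SVR. The sensor layout, the log-distance path-loss model used to synthesise training data, and all parameter values are assumptions for illustration only.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.multioutput import MultiOutputRegressor

        rng = np.random.default_rng(0)
        # Four fixed radio sensors at the corners of a 10 m x 10 m area (assumed layout).
        sensors = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)

        def rss(source_xy):
            """Synthetic received signal strength: log-distance path loss plus noise."""
            d = np.linalg.norm(sensors - source_xy, axis=1) + 1e-3
            return -20.0 * np.log10(d) + rng.normal(0.0, 0.5, size=len(sensors))

        # Training set: known PD source positions and the RSS patterns they produce.
        train_xy = rng.uniform(0, 10, size=(500, 2))
        X_train = np.array([rss(p) for p in train_xy])

        # One RBF-kernel SVR per output coordinate (x and y).
        model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0, epsilon=0.1))
        model.fit(X_train, train_xy)

        true_xy = np.array([3.0, 7.0])
        est_xy = model.predict(rss(true_xy).reshape(1, -1))[0]
        print("true:", true_xy, "estimated:", est_xy)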

    Professor Anthony P. F. Turner: An innovative educator and pioneer of biosensors in the 21st century (On his 60th birth anniversary)


    Extreme Gradient Boosting as a Method for Quantitative Structure–Activity Relationships

    In the pharmaceutical industry it is common to generate many QSAR models from training sets containing a large number of molecules and a large number of descriptors. The best QSAR methods are those that can generate the most accurate predictions but that are not overly expensive computationally. In this paper we compare eXtreme Gradient Boosting (XGBoost) to random forest and single-task deep neural nets on 30 in-house data sets. While XGBoost has many adjustable parameters, we can define a set of standard parameters at which XGBoost makes predictions, on average, better than those of random forest and almost as good as those of deep neural nets. The biggest strength of XGBoost is its speed. Whereas efficient use of random forest requires generating each tree in parallel on a cluster, and deep neural nets are usually run on GPUs, XGBoost can be run on a single CPU in less than a third of the wall-clock time of either of the other methods.
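    A minimal sketch (assumptions throughout) of the comparison described above: fitting an XGBoost regressor and a random forest on a synthetic "molecules x descriptors" matrix standing in for a QSAR data set. The 30 in-house data sets and the paper's standard XGBoost parameters are not reproduced here; the values below are illustrative defaults.

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score
        from xgboost import XGBRegressor

        # Synthetic descriptor matrix with a continuous activity target.
        X, y = make_regression(n_samples=2000, n_features=200, noise=5.0, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

        # Illustrative parameters; n_jobs=1 mirrors the single-CPU speed claim.
        xgb = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.05,
                           subsample=0.8, n_jobs=1)
        rf = RandomForestRegressor(n_estimators=300, n_jobs=-1, random_state=0)

        for name, model in [("XGBoost", xgb), ("Random forest", rf)]:
            model.fit(X_tr, y_tr)
            print(name, "R^2:", round(r2_score(y_te, model.predict(X_te)), 3))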